Browsing by Author "Mason, Scott"
Now showing 1 - 5 of 5
- Item (Open Access) Analysis of photogrammetrically-derived digital surface and terrain models for building recognition (1997) Mtshatsha, Bandile; Mason, Scott. Buildings are among the most frequently occurring man-made objects, and in urban scenes their detection and reconstruction, e.g. in the form of three-dimensional CAD (computer-aided design) models, is very important to many users such as architects, town planners, and telecommunications and environmental engineers. This thesis examines the role of digital terrain and surface models in supporting this reconstruction process. The thesis is structured into four main parts: image matching to derive the data sets; building detection to delineate buildings from other man-made objects in the DSM (digital surface model); DSM quality analysis to determine the reliability of the data; and hydrological analysis to determine flood zones as an additional example of DTM application; followed by conclusions and a possible future outlook. Image matching was performed using in-house image matching software developed in the Geomatics department. Off-the-shelf GIS functionality was used to tackle building detection, DSM quality analysis and hydrological analysis. A key feature of GIS functionality is the ability to exploit standard functions for input/output, data management, spatial analysis, editing and visualisation; it also enhances the accessibility of the developed tools to end users.
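As a rough illustration of the kind of building-detection step the abstract describes (a minimal sketch, not the thesis's actual GIS workflow; the grids, threshold and values below are invented for illustration), subtracting a DTM from a DSM yields object heights above the terrain, which can be thresholded to flag candidate building cells:

```python
import numpy as np

# Hypothetical toy grids: a DSM (surface, including buildings/trees) and a
# DTM (bare earth). Real data would come from image matching, as in the thesis.
dsm = np.array([[100.0, 100.0, 108.0, 108.0],
                [100.0, 100.0, 108.5, 108.0],
                [100.0, 100.0, 100.0, 100.0]])
dtm = np.full_like(dsm, 100.0)

# Normalised DSM: object heights above the terrain.
ndsm = dsm - dtm

# Candidate building cells: objects taller than a minimum building height.
MIN_BUILDING_HEIGHT = 2.5  # metres; an assumed threshold
building_mask = ndsm > MIN_BUILDING_HEIGHT

print(int(building_mask.sum()))  # → 4 candidate building cells
```

In practice further criteria (size, shape, spectral cues) would be needed to separate buildings from trees and other tall objects.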
- Item (Open Access) Application of information systems in irregular settlement management and low-cost housing provision (1998) Crone, Simon Michael; Mason, Scott. Information systems, both paper-based and computer-based, are integral to the management of irregular settlements and to the process of delivering low-cost housing in South Africa. An irregular settlement can be defined as an area where the 'shacks' have no fixed street address. Due to the policies of previous regimes, under whose rule irregular settlements were almost ignored, there is often little or no spatial or socio-economic data available about existing irregular settlements, either for the use of the community or for organisations interested in helping to improve the quality of life of the residents living in these settlements. As a prerequisite to quality of life, the basic need for shelter, along with food, healthcare and education, needs to be met. The emphasis today is thus being placed on the provision of low-cost housing. A need therefore arises for up-to-date information about these irregular settlements in order to plan either for the upgrading of the settlement or for relocation to new low-cost housing developments. Currently, mostly paper-based systems are used in these developments. There are two opportunities where computer-oriented information systems could be used at this time (1996 and 1997) to assist with the management and upgrading of irregular settlements. The first is the stage of managing an existing irregular settlement; the second is managing the process of housing provision, taking advantage of the project-linked subsidy scheme. Two Cape Town based projects provide case studies for the application of information systems at the two stages identified above. The first is the Marconi Beam 'From Shacks to Houses' project located in Milnerton. The second is the Integrated Services Land Project (iSLP) of the Cape Flats.
The Marconi Beam settlement is an irregular settlement that has been accepted as part of the 'Project-Linked Subsidy Scheme' for the provision of new low-cost housing. Previously, only paper-based systems were being used to manage the settlement and its move to the new Joe Slovo Park formal housing development. There was also found to be a lack of appropriate tools, and of awareness of which technology could be used in the process. Some of the specific application areas in which we were able to provide solutions in Marconi Beam included: ■ identifying the people directly affected by the fire that swept through the settlement in October 1996; ■ identifying the residents who would be affected by the construction of a new road through one area of the settlement, facilitating their movement away from the area; and ■ devising a system for tracking the internal moves of residents, by which we were able to maintain a record of internal movements whilst the lottery system was in place. Subsequently, with the use of the Block System, residents who were required to come in and have their applications for new houses processed, as a result of their spatial location in the settlement, could be identified. The Indlu Management System, a computer-based system, resulted from the need to keep track of, and process, large amounts of socio-economic data in order to speedily handle the large number of applicants applying for national housing subsidies. As a result of the implementation of this system, processing times have been reduced from 30 to 10 minutes per applicant. The successful use of these systems in the two projects demonstrates that information systems have a definite role to play in the management of irregular settlements and the provision of low-cost housing.
- Item (Open Access) Investigation of the effects of image compression on the geometric quality of digital photogrammetric imagery (1997) Kwabena-Forkuo, Eric; Mason, Scott. We are living in a decade in which the use of digital images is becoming increasingly important. Photographs are now converted into digital form, and direct acquisition of digital images is becoming increasingly common as sensors and associated electronics advance. Unlike images in analogue form, digital representation of images allows visual information to be easily manipulated in useful ways. One practical problem of digital image representation is that it requires a very large number of bits, and hence one encounters a fairly large volume of data in a digital production environment if images are stored uncompressed on disk. With the rapid advances in sensor technology and digital electronics, the number of bits grows ever larger in softcopy photogrammetry, remote sensing and multimedia GIS. As a result, it is desirable to find efficient representations for digital images in order to reduce the memory required for storage, improve the data access rate from storage devices, and reduce the time required for transfer across communication channels. The component of digital image processing that deals with this problem is called image compression. Image compression is a necessity for the utilisation of large digital images in softcopy photogrammetry, remote sensing, and multimedia GIS. Numerous image compression standards exist today with the common goal of reducing the number of bits needed to store images, and of facilitating the interchange of compressed image data between various devices and applications. The JPEG image compression standard is one alternative for carrying out the image compression task. This standard was formed under the auspices of the ISO and CCITT for the purpose of developing an international standard for the compression and decompression of continuous-tone, still-frame, monochrome and colour images.
The JPEG standard algorithm falls into three general categories: the baseline sequential process, which provides a simple and efficient algorithm for most image coding applications; the extended DCT-based process, which allows the baseline system to satisfy a broader range of applications; and an independent lossless process for applications demanding that type of compression. This thesis experimentally investigates the geometric degradations resulting from lossy JPEG compression of photogrammetric imagery at various quality factors. The effects and the suitability of lossy JPEG compression on industrial photogrammetric imagery are investigated. Examples are drawn from the extraction of targets in close-range photogrammetric imagery. In the experiments, JPEG was used to compress and decompress a set of test images. The algorithm was tested on digital images containing various levels of entropy (a measure of the information content of an image) with different image capture capabilities. Residual data were obtained by taking the pixel-by-pixel difference between the original data and the reconstructed data. The root mean square (RMS) error of the residual was used as the measure for judging the quality of images produced by the JPEG (DCT-based) compression technique. Two techniques, TIFF (LZW) compression and JPEG (DCT-based) compression, were compared with respect to the compression ratios achieved. JPEG (DCT-based) yields better compression ratios, and it seems to be a good choice for image compression. Further, it was found that, for grey-scale images, the best compression ratios were obtained when quality factors between 60 and 90 were used (i.e., at compression ratios of 1:10 to 1:20). At these quality factors the reconstructed data showed virtually no degradation in visual or geometric quality for the application at hand.
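The RMS-of-residuals quality measure described above is straightforward to sketch (a minimal illustration with invented toy pixel values, not the thesis's test data):

```python
import numpy as np

def rms_error(original, reconstructed):
    """Root mean square error of the pixel-by-pixel residual image."""
    residual = original.astype(float) - reconstructed.astype(float)
    return float(np.sqrt(np.mean(residual ** 2)))

# Toy 8-bit grey-scale arrays standing in for an original image and its
# JPEG-decompressed counterpart.
original = np.array([[10, 20], [30, 40]], dtype=np.uint8)
reconstructed = np.array([[12, 18], [30, 44]], dtype=np.uint8)

print(rms_error(original, reconstructed))  # sqrt((4 + 4 + 0 + 16) / 4) ≈ 2.449
```

An RMS error near zero indicates that decompression reproduced the original almost exactly; larger values signal visible or geometrically significant degradation.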
Recently, many fast and efficient image file formats have also been developed to store, organise and display images in an efficient way. Almost every image file format incorporates some kind of compression method to manage data within commonplace networks and storage devices. The major file formats currently used in softcopy photogrammetry, remote sensing and multimedia GIS were also investigated. It was found that the choice of a particular image file format for a given application generally involves several interdependent considerations, including quality, flexibility, computation, and storage or transmission. The suitability of a file format for a given purpose is best determined by knowing its original purpose. Some formats are widely used (e.g., TIFF, JPEG) and serve as exchange formats; others are adapted to the needs of particular applications or particular operating systems.
- Item (Open Access) A line photogrammetry algorithm for 3D rectilinear object reconstruction (1998) Hill, Justin John Whatton; Mason, Scott; Rüther, Heinz. This thesis introduces an alternative formulation for line photogrammetry. The aim was to develop and test a method of computing the position and orientation of a straight line in space using two or more oriented images of that line. The algorithm presented is intended for object reconstruction and is motivated by the need to reconstruct man-made objects in urban areas, such as buildings, as well as by applications in industrial inspection. The method obtains a best-fit line through a "pencil of planes". The reconstructed 3D line is defined by two points, as opposed to the conventional representation, which uses a point and a direction vector. The approach involves calculating, for each image, a projection plane containing the perspective centre and two transformed line-point observations in the image. A least squares adjustment then fits a straight line as closely as possible to the projection planes from all images simultaneously. The adjusted line is referred to as the best-fitting line through a "pencil of planes" (POP). In this project, a mathematical model was formulated for the application of this concept. The algorithm was coded and tested on two cases. A set of scanned aerial images of a residential area at a scale of 1:5000 provided the primary test case. Lines delineating three roofs visible in the aerial images were reconstructed using the POP method and compared with ground truth data. The lines reconstructed using the POP method were also compared to those reconstructed using an existing method of line photogrammetry proposed by Mulawa (1988). The second test was based on a set of close-range images captured using a small-format digital camera. Lines delineating the bars of a metal frame, generally used as a precise control field for camera calibration, were reconstructed.
In both test cases, χ² tests were applied and standard deviations calculated. In the aerial case, the standard deviations obtained were generally in the region of 5cm; the ground resolution of the images was 7.5cm. In the close-range case the ground resolution was approximately 1.3mm, and the standard deviations obtained were generally of the order of 0.7mm. Of the lines computed, 84% of the adjustments passed the χ² test. The results obtained confirm that the POP algorithm is a practicable means of adjusting observations to obtain best-fitting 3D lines from observations made in a set of oriented images.
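A simplified sketch of the geometric core of the pencil-of-planes idea (not the thesis's actual adjustment model, which defines the line by two points and adjusts image observations; the planes below are synthetic): if each image contributes a projection plane n_i · x = d_i, the line direction should be nearly orthogonal to every plane normal, and a point on the line should nearly satisfy every plane equation:

```python
import numpy as np

def fit_line_to_planes(normals, offsets):
    """Least-squares line through a pencil of planes (n_i . x = d_i).

    Direction: the unit vector v minimising sum((n_i . v)^2), i.e. the
    right singular vector of the stacked normals with the smallest
    singular value. Point: the least-squares solution of N p = d.
    """
    N = np.asarray(normals, dtype=float)
    d = np.asarray(offsets, dtype=float)
    _, _, vt = np.linalg.svd(N)
    direction = vt[-1]                     # smallest-singular-value vector
    point, *_ = np.linalg.lstsq(N, d, rcond=None)
    return point, direction

# Synthetic pencil: three planes that all contain the vertical line
# through (x, y) = (1, 2), so the true direction is (0, 0, 1).
normals = [(1.0, 0.0, 0.0),   # plane x = 1
           (0.0, 1.0, 0.0),   # plane y = 2
           (1.0, 1.0, 0.0)]   # plane x + y = 3
offsets = [1.0, 2.0, 3.0]

p, v = fit_line_to_planes(normals, offsets)
```

With noisy real planes the same least-squares machinery yields the best-fitting line rather than an exact intersection.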
- Item (Open Access) Pattern recognition and the nondeterminable affine parameter problem (1998) Geffen, Nathan; Mason, Scott. This thesis reports on the process of implementing pattern recognition systems using classification models such as artificial neural networks (ANNs) and algorithms whose theoretical foundations come from statistics. The issues involved in implementing several classification models and pre-processing operators - which are applied to patterns before classification takes place - are discussed, and a methodology commonly used in developing pattern recognition systems is described. In addition, a number of pattern recognition systems for two image recognition problems that occur in the field of image matching were developed. These image recognition problems and the issues involved in solving them are described in detail. Numerous experiments were carried out to test the accuracy and speed of the systems developed to solve these problems. These experiments and their results are also discussed.
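To make the statistically-founded classifier side of such a methodology concrete, here is a minimal sketch of one of the simplest models of that kind, a nearest-mean classifier (the class, features and data below are invented for illustration; the thesis's actual models and recognition problems are considerably more involved):

```python
import numpy as np

class NearestMeanClassifier:
    """Minimal statistical classifier: assign each pattern to the class
    whose training-set mean is closest in feature space."""

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        self.classes_ = sorted(set(y))
        self.means_ = {c: X[[label == c for label in y]].mean(axis=0)
                       for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.means_[c]))
                for x in np.asarray(X, dtype=float)]

# Hypothetical 2-D feature vectors for two well-separated pattern classes.
X_train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0),
           (5.0, 5.0), (6.0, 5.0), (5.0, 6.0)]
y_train = ["a", "a", "a", "b", "b", "b"]

clf = NearestMeanClassifier().fit(X_train, y_train)
print(clf.predict([(0.5, 0.5), (5.5, 5.5)]))  # → ['a', 'b']
```

ANNs and richer statistical models follow the same fit/predict pattern but learn far more flexible decision boundaries.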